PII: S0893-6080(98)00139-7

Authors

  • R. F. Hadley
  • V. C. Cardei
Abstract

A connectionist-inspired, parallel processing network is presented which learns, on the basis of (relevantly) sparse input, to assign meaning interpretations to novel test sentences in both active and passive voice. Training and test sentences are generated from a simple recursive grammar, but once trained, the network successfully processes thousands of sentences containing deeply embedded clauses. All training is unsupervised with regard to error feedback – only Hebbian and self-organizing forms of training are employed. In addition, the active–passive distinction is acquired without any supervised provision of cues or flags (in the output layer) that indicate whether the input sentence is in active or passive voice. In more detail: (1) The model learns on the basis of a corpus of about 1000 sentences while the set of potential test sentences contains over 100 million sentences. (2) The model generalizes its capacity to interpret active and passive sentences to substantially deeper levels of clausal embedding. (3) After training, the model satisfies criteria for strong syntactic and strong semantic systematicity that humans also satisfy. (4) Symbolic message passing occurs within the model’s output layer. This symbolic aspect reflects certain prior language acquisition assumptions. © 1999 Elsevier Science Ltd. All rights reserved.
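The abstract does not specify the model's architecture or update equations, but the class of learning it names (Hebbian, error-free, unsupervised) has a standard minimal form: weights between co-active units are strengthened, with no error signal. The sketch below is an illustrative generic Hebbian update with Oja-style normalization for stability, not the paper's actual model; all names and parameters are assumptions.

```python
import numpy as np

def hebbian_update(W, x, y, lr=0.01):
    """One Hebbian step: strengthen weights between co-active units.

    W  : (n_out, n_in) weight matrix
    x  : (n_in,) presynaptic activity
    y  : (n_out,) postsynaptic activity
    lr : learning rate

    The outer-product term is plain Hebbian learning; the second term
    is Oja's decay, which keeps the weights bounded without any
    supervised error feedback.
    """
    return W + lr * (np.outer(y, x) - (y ** 2)[:, None] * W)

# Toy usage: repeatedly present one input pattern; the unit's weights
# align with it, so its response to that pattern grows and saturates.
rng = np.random.default_rng(0)
x = np.array([1.0, 0.0, 1.0])
W = rng.normal(scale=0.1, size=(1, 3))
for _ in range(500):
    y = W @ x               # linear postsynaptic response
    W = hebbian_update(W, x, y)

print(W @ x)  # a clearly positive, saturated response
```

Note that no target output ever appears in the update: learning is driven entirely by correlations in the activity, which is what "unsupervised with regard to error feedback" amounts to.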


Similar Articles

Recurrent neural networks with trainable amplitude of activation functions

An adaptive amplitude real time recurrent learning (AARTRL) algorithm for fully connected recurrent neural networks (RNNs) employed as nonlinear adaptive filters is proposed. Such an algorithm is beneficial when dealing with signals that have rich and unknown dynamical characteristics. Following the approach from, three different cases for the algorithm are considered; a common adaptive amplitu...

Full text

Explanation of the "virtual input" phenomenon

We write this letter to comment on the "virtual input" phenomenon reported by Thaler (Neural Networks, 8(1) (1995) 55-65). The author attributed the phenomenon to the network's ability to perform pattern classification and completion, and reported that pruning probability affects the number of virtual inputs observed. Our independent study of Thaler's results, however, reveals a simpler explana...

Full text

Parallel and robust skeletonization built on self-organizing elements

A massively parallel neural architecture is suggested for the approximate computation of the skeleton of a planar shape. Numerical examples demonstrate the robustness of the method. The architecture is constructed from self-organizing elements that allow the extension of the concept of skeletonization to areas remote to image processing.

Full text

Model selection in neural networks

In this article, we examine how model selection in neural networks can be guided by statistical procedures such as hypothesis tests, information criteria and cross validation. The application of these methods in neural network models is discussed, paying attention especially to the identification problems encountered. We then propose five specification strategies based on different statistical ...

Full text

Information storage capacity of incompletely connected associative memories

In this paper, the memory capacity of incompletely connected associative memories is investigated. First, the capacity is derived for memories with fixed parameters. Optimization of the parameters yields a maximum capacity between 0.53 and 0.69 for hetero-association and half of it for autoassociation improving previously reported results. The maximum capacity grows with increasing connectivity...

Full text


Journal:

Volume   Issue

Pages  -

Publication date: 1999